Advances in Metaheuristics for Hard Optimization - ReadingSample
Authors
Abstract
The continuous Constrained Optimization Problem (COP) often occurs in industrial applications. In this study, we compare three novel algorithms developed for solving the COP. The first approach is an Interval Partitioning Algorithm (IPA) that is exhaustive in covering the whole feasible space. IPA can discard sub-spaces that are sub-optimal and/or infeasible, similar to available Branch and Bound techniques; it differs in its use of Interval Arithmetic rather than the conventional bounding techniques described in the literature. The second approach tested here is a novel dual-sequence Simulated Annealing (SA) algorithm that eliminates the use of penalties for constraint handling. We also introduce a hybrid algorithm that integrates SA into IPA (IPA-SA) and compare its performance with the stand-alone SA and IPA algorithms. All three methods incorporate a local COP solver, Feasible Sequential Quadratic Programming (FSQP), to identify feasible stationary points. The performances of these three methods are tested on a suite of COP benchmarks and the results are discussed.
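The abstract does not give the details of IPA, so the following is only a minimal sketch of the general idea behind interval-based box discarding: bound the objective and the constraint over each box with interval arithmetic, prune boxes that are provably infeasible or provably worse than the best feasible value found so far, and bisect the rest. The toy problem, function names (f_interval, g_interval, interval_partition), and pruning rules are illustrative assumptions; the paper's actual IPA, dual-sequence SA, and FSQP components are not reproduced here.

```python
# Minimal sketch of interval partitioning with interval arithmetic (illustrative only).
from typing import List, Tuple

Box = Tuple[float, float]  # closed interval [lo, hi]

def i_add(a: Box, b: Box) -> Box:
    return (a[0] + b[0], a[1] + b[1])

def i_mul(a: Box, b: Box) -> Box:
    p = (a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1])
    return (min(p), max(p))

def i_sqr(a: Box) -> Box:
    lo, hi = a
    if lo >= 0: return (lo*lo, hi*hi)
    if hi <= 0: return (hi*hi, lo*lo)
    return (0.0, max(lo*lo, hi*hi))

# Toy problem (assumed for illustration): minimize f(x) = (x - 1)^2
# subject to g(x) = 0.5 - x <= 0, over x in [-2, 2].
def f_interval(x: Box) -> Box:
    return i_sqr(i_add(x, (-1.0, -1.0)))

def g_interval(x: Box) -> Box:
    return i_add((0.5, 0.5), i_mul((-1.0, -1.0), x))

def interval_partition(x0: Box, tol: float = 1e-3) -> float:
    best_upper = float("inf")          # best feasible objective value found so far
    work: List[Box] = [x0]
    while work:
        box = work.pop()
        g_lo, _ = g_interval(box)
        if g_lo > 0:                   # g > 0 over the whole box: provably infeasible
            continue
        f_lo, _ = f_interval(box)
        if f_lo > best_upper:          # box cannot contain a better point: sub-optimal
            continue
        mid = 0.5 * (box[0] + box[1])
        if 0.5 - mid <= 0:             # midpoint is feasible: use it as an upper bound
            best_upper = min(best_upper, (mid - 1.0) ** 2)
        if box[1] - box[0] > tol:      # bisect and keep exploring
            work.append((box[0], mid))
            work.append((mid, box[1]))
    return best_upper

if __name__ == "__main__":
    print(interval_partition((-2.0, 2.0)))   # approaches 0.0 (optimum at x = 1)
```

In the hybrid IPA-SA described in the abstract, a stochastic search such as SA would replace the naive midpoint sampling used above to obtain feasible upper bounds inside promising boxes, while FSQP would refine those points locally.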
Similar References
Advances in Metaheuristics for Hard Optimization
Springer, ISBN-10: 3642092063, ISBN-13: 978-3642092060, paperback, 481 pages, 2010. Many advances have been made recently in metaheuristic methods, from theory to applications. The community of researchers claiming the relevance of their work to the field of metaheuristics is growing faster and faster, despite the fact that the term itself has not been precisely defined. Numerous books have been publ...
Parallel Local Search
Local search metaheuristics are a recognized means of solving hard combinatorial problems. Over the last couple of decades, significant advances have been made in terms of the formalization, applicability and performance of these methods. Key to the performance aspect is the increased availability of parallel hardware, which turns out to be largely exploitable by this class of procedures. As re...
Hidden Markov Models Training Using Population-based Metaheuristics
In this chapter, we consider the issue of Hidden Markov Model (HMM) training. First, HMMs are introduced and then we focus on the particular HMM training problem. We emphasize the difficulty of this problem and present various criteria that can be considered. Many different adaptations of metaheuristics have already been used but, until now, few extensive comparisons have been performed on t...
Learning Structure Illuminates Black Boxes - An Introduction to Estimation of Distribution Algorithms
This chapter serves as an introduction to estimation of distribution algorithms (EDAs). Estimation of distribution algorithms are a new paradigm in evolutionary computation. They combine statistical learning with population-based search in order to automatically identify and exploit certain structural properties of optimization problems. State-of-the-art EDAs consistently outperform classical g...